Introduction: Acute lymphoblastic leukemia (ALL) remains a rare but aggressive hematologic malignancy among older adults, and population-level data characterizing its long-term mortality burden in the United States are limited. This study evaluated age-adjusted mortality trends for ALL as the underlying cause of death among adults aged ≥65 years across demographic and geographic subgroups in the U.S. from 1968 to 2023, with projections through 2050.

Methods: ALL-related deaths were identified using ICD-8/ICD-9 code 204.0 and ICD-10 code C91.0 in the CDC WONDER database. Age-adjusted mortality rates (AAMRs) per 100,000 were calculated annually. Temporal trends were assessed using Joinpoint regression to estimate annual percent change (APC) and average annual percent change (AAPC) with 95% confidence intervals. Forecasts through 2050 were generated using ARIMA models with Box-Cox transformation, following stationarity testing via the Augmented Dickey-Fuller (ADF) and Kwiatkowski-Phillips-Schmidt-Shin (KPSS) tests. Model performance was validated using residual diagnostics and rolling-origin cross-validation.

Results: From 1968 to 2023, a total of 26,102 deaths were attributed to ALL as the underlying cause among U.S. adults aged ≥65 years. The overall AAMR was 1.47 per 100,000. Joinpoint regression identified three inflection points (1976, 1989, 2002): AAMRs declined from 1968 to 1976 (APC: –7.90%, 95% CI: –10.2 to –5.5, p < 0.001), plateaued from 1976 to 1989 (APC: +0.63%, 95% CI: –0.3 to 1.6, p = 0.205), declined again from 1989 to 2002 (APC: –2.41%, 95% CI: –3.0 to –1.8, p < 0.001), and stabilized from 2002 to 2023 (APC: +0.08%, 95% CI: –0.4 to 0.6, p = 0.768). The overall trend showed a significant decline (AAPC: –1.58%, 95% CI: –1.9 to –1.2, p < 0.001). Mortality was higher among males (1.90 per 100,000) than females (1.20 per 100,000).
Both sexes exhibited significant overall declines: males (AAPC: –1.79%, 95% CI: –2.2 to –1.2, p < 0.00001) and females (AAPC: –1.45%, 95% CI: –2.08 to –0.81, p < 0.05). However, trends in the most recent decade were non-significant in both groups, indicating a plateau. Regionally, the highest AAMR was observed in the Midwest (1.64 per 100,000), followed by the South (1.49), West (1.44), and Northeast (1.29); all regions exhibited significant declines in AAMRs over time. Crude mortality rates rose with age: 1.02 per 100,000 at ages 65–74, 1.78 per 100,000 at ages 75–84, and 2.50 per 100,000 at ages ≥85. ARIMA-based projections indicate that the AAMR for ALL among U.S. adults aged ≥65 years will continue its downward trajectory, decreasing from 1.23 per 100,000 in 2024 (95% CI: 1.06–1.45) to 0.85 per 100,000 by 2050 (95% CI: 0.53–1.49), an approximate 31% relative decline. Although the 95% confidence intervals widen modestly over time, the projected decline remains consistent with a sustained reduction through 2050.
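The age-standardization step described in Methods weights each age band's crude rate by its share of a fixed standard population. A minimal Python sketch, using the three band-specific crude rates reported above; the population weights are assumptions approximating the 2000 U.S. standard million for these age bands, not values taken from the study:

```python
# Direct age standardization: the AAMR is the weighted average of
# band-specific crude rates, weighted by a standard population.
# Counts below approximate the 2000 U.S. standard million for ages
# 65-74, 75-84, and >=85 (assumed for illustration).
std_pop = {"65-74": 66_037, "75-84": 44_841, "85+": 15_508}
crude = {"65-74": 1.02, "75-84": 1.78, "85+": 2.50}  # deaths per 100,000

total = sum(std_pop.values())
aamr = sum(crude[band] * std_pop[band] / total for band in std_pop)
print(round(aamr, 2))  # -> 1.47
```

That the weighted average reproduces the reported overall AAMR of 1.47 per 100,000 serves as an internal-consistency check on the band-specific rates.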

Conclusion: Among U.S. adults aged ≥65 years, age-adjusted mortality rates for acute lymphoblastic leukemia have declined significantly over the past five decades, with an overall average annual percent change of –1.58%. Despite this long-term improvement, recent trends suggest stabilization rather than continued decline. Males bear a higher mortality burden, and the age-dependent gradient highlights an increasing burden with advancing age, though all subgroups experienced significant long-term reductions. Regionally, the Midwest reported the highest mortality, but all regions showed meaningful declines. ARIMA-based projections indicate a continued relative decline in AAMR through 2050, reinforcing the historical downward trajectory.
